Deepfake pornography


The rise of deepfake pornography in schools: 'One girl was so horrified she vomited'

The Guardian

'It reflects and reinforces a culture where consent and respect for personal boundaries are undermined.' The use of 'nudify' apps is becoming more and more prevalent, with hundreds of teachers having seen images created by pupils, often of their peers. He didn't feel this was something he shouldn't be doing. It was in the open and people saw it.


Man fined $340,000 for deepfake pornography of prominent Australian women in first-of-its-kind case

The Guardian

The eSafety commissioner, Julie Inman Grant, took Anthony Rotondo to court in 2023 after he replied to a removal notice, saying it meant nothing to him as he was not an Australian resident. The watchdog applauded the "strong message" sent after the federal court ordered the Gold Coast man to pay for posting deepfake images to a now-defunct website. A man who posted deepfake pornographic images of prominent Australian women has been slapped with a hefty fine as a "strong message" in a first-of-its-kind case. The federal court ordered Anthony Rotondo, also known as Antonio, to pay a $343,500 penalty plus costs on Friday after the online regulator, the eSafety Commissioner, brought a case against him almost two years ago. Rotondo admitted to posting the images on a website called MrDeepFakes.com,


Using AI to Humiliate Women: The Men Behind Deepfake Pornography

Der Spiegel International

The whistleblower confirmed to DER SPIEGEL that all Clothoff employees work in countries that used to belong to the Soviet Union. That is consistent with the fact that all of the company's internal communications that DER SPIEGEL has in its possession are completely in Russian, and the company's email service is also based in Russia. The four central players declined to respond to attempts by DER SPIEGEL to contact them for the story published in December 2024. A person named Elias did get in touch, however, claiming to be a spokesperson for the app. He said the four people mentioned above were unknown to him.


From spy cams to deepfake porn: fury in South Korea as women targeted again

The Guardian

For the second time in just a few years, South Korean women took to the streets of Seoul to demand an end to sexual abuse. When the country spearheaded Asia's #MeToo movement, the culprit was molka – spy cams used to record women without their knowledge. Now their fury was directed at an epidemic of deepfake pornography. For Juhee Jin, 26, a Seoul resident who advocates for women's rights, the emergence of this new menace, in which women and girls are again the targets, was depressingly predictable. "This should have been addressed a long time ago," says Jin, a translator.


The US Needs Deepfake Porn Laws. These States Are Leading the Way

WIRED

As national legislation on deepfake pornography crawls its way through Congress, states across the country are trying to take matters into their own hands. Thirty-nine states have introduced a hodgepodge of laws designed to deter the creation of nonconsensual deepfakes and punish those who make and share them. Earlier this year, Democratic congresswoman Alexandria Ocasio-Cortez, herself a victim of nonconsensual deepfakes, introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act, or Defiance Act. If passed, the bill would allow victims of deepfake pornography to sue as long as they could prove the deepfakes had been made without their consent. In June, Republican senator Ted Cruz introduced the Take It Down Act, which would require platforms to remove both revenge porn and nonconsensual deepfake porn.


British female politicians targeted by fake pornography

The Guardian

British female politicians have become the victims of fake pornography, with some of their faces used in nude images created using artificial intelligence. Political candidates targeted on one prominent fake pornography website include the Labour deputy leader, Angela Rayner; the education secretary, Gillian Keegan; the Commons leader, Penny Mordaunt; the former home secretary, Priti Patel; and the Labour backbencher Stella Creasy, according to Channel 4 News. Many of the images have been online for several years and have attracted hundreds of thousands of views. While some are crude Photoshops featuring the politician's head imposed on to another person's naked body, other images appear to be more sophisticated deepfakes created using artificial intelligence technology. Some of the politicians targeted have now contacted the police.


Making deepfake images is increasingly easy – controlling their use is proving all but impossible

The Guardian

"Very creepy," was April's first thought when she saw her face on a generative AI website. April is one half of the Maddison twins. She and her sister Amelia make content for OnlyFans, Instagram and other platforms, but they also existed as a custom generative AI model – made without their consent. "It was really weird to see our faces, but not really our faces," she says. Deepfakes – the creation of realistic but false imagery, video and audio using artificial intelligence – are on the political agenda after the federal government announced last week it would introduce legislation to ban the creation and sharing of deepfake pornography as part of measures to combat violence against women.


Celebrity Deepfake Porn Cases Will Be Investigated by Meta Oversight Board

WIRED

As AI tools become increasingly sophisticated and accessible, so too has one of their worst applications: non-consensual deepfake pornography. While much of this content is hosted on dedicated sites, it is increasingly finding its way onto social platforms. Today, the Meta Oversight Board announced that it was taking on cases that could force the company to reckon with how it deals with deepfake porn. The board, which is an independent body that can issue both binding decisions and recommendations to Meta, will focus on two deepfake porn cases, both regarding celebrities who had their images altered to create explicit content. In one case about an unnamed American celebrity, deepfake porn depicting the celebrity was removed from Facebook after it had already been flagged elsewhere on the platform.


Nearly 4,000 celebrities found to be victims of deepfake pornography

The Guardian

More than 250 British celebrities are among the thousands of famous people who are victims of deepfake pornography, an investigation has found. A Channel 4 News analysis of the five most visited deepfake websites found almost 4,000 famous individuals were listed, of whom 255 were British. They include female actors, TV stars, musicians and YouTubers, who have not been named, whose faces were superimposed on to pornographic material using artificial intelligence. The investigation found that the five sites received 100m views in the space of three months. The Channel 4 News presenter Cathy Newman, who was found to be among the victims, said: "It feels like a violation. It just feels really sinister that someone out there who's put this together, I can't see them, and they can see this kind of imaginary version of me, this fake version of me."


AI deepfakes are endangering democracy. Here are 4 ways to fight back

FOX News

With the recent explosion of AI, dazzling images, videos, audio and texts can now be easily generated by anyone with just a few simple inputs. While this technology offers many astonishing benefits, it also poses significant dangers. Among the most pernicious of these is the creation of deepfakes – highly realistic yet manipulated or fabricated content that falsely depicts real people doing or saying things they never did. Our ability to discern fact from fiction, along with democracy itself, are in the crosshairs. In recent months, deepfakes have entered the mainstream like never before.